
    Parallel Ada benchmarks for the SVMS

    The use of the parallel processing paradigm to design and develop faster and more reliable computers appears to clearly mark the future of information processing. NASA started the development of such an architecture: the Spaceborne VHSIC Multi-processor System (SVMS). Ada will be one of the languages used to program the SVMS. One of the unique characteristics of Ada is that it supports parallel processing at the language level through its tasking constructs. It is important for the SVMS project team to assess how efficiently the SVMS architecture will be implemented, as well as how efficiently the Ada environment will be ported to the SVMS. AUTOCLASS II, a Bayesian classifier written in Common Lisp, was selected as one of the benchmarks for SVMS configurations. The purpose of the R&D effort was to provide the SVMS project team with a version of AUTOCLASS II, written in Ada, that would make use of Ada tasking constructs as much as possible so as to constitute a suitable benchmark. Additionally, a set of programs was developed to measure Ada tasking efficiency on parallel architectures and to determine the critical parameters influencing tasking efficiency. Together, these were designed to provide the SVMS project team with a set of suitable tools for the development of the SVMS architecture.
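
    The tasking-efficiency measurements described above amount to timing how quickly many small tasks can be created, run and synchronised. As a purely illustrative analogue (not the Ada benchmark suite described in the abstract), a minimal Python threading sketch of such a micro-benchmark might look like this:

        import threading
        import time

        def spawn_and_join(n_tasks):
            """Time the creation, start and join of n_tasks trivial tasks --
            a rough analogue of a tasking-overhead benchmark."""
            def trivial():
                pass
            start = time.perf_counter()
            tasks = [threading.Thread(target=trivial) for _ in range(n_tasks)]
            for t in tasks:
                t.start()
            for t in tasks:
                t.join()
            return time.perf_counter() - start

        # Varying the task count exposes how overhead scales with the number of tasks.
        for n in (10, 100, 1000):
            print(f"{n:5d} tasks: {spawn_and_join(n) * 1e3:.2f} ms total")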

    Ada in AI or AI in Ada. On developing a rationale for integration

    The use of Ada as an Artificial Intelligence (AI) language is gaining interest in the NASA community, i.e., among parties who need to deploy knowledge-based systems (KBS) compatible with the use of Ada as the software standard for the Space Station. A fair number of KBS and pseudo-KBS implementations in Ada exist today, but no widely used guidelines exist to compare and evaluate them with one another. The lack of guidelines illustrates a fundamental problem inherent in trying to compare and evaluate implementations in languages that are procedural or imperative in style, such as Ada, with those in languages that are functional in style, such as Lisp. The strengths and weaknesses of using Ada as an AI language are discussed, and a preliminary analysis is provided of the factors needed to develop criteria for the integration of these two families of languages and the environments in which they are implemented. The intent in developing such criteria is to have a logical rationale that may be used to guide the development of Ada tools and methodology to support KBS requirements, and to identify those AI technology components that may most readily and effectively be deployed in Ada.

    Definition and characterization of localised meningitis epidemics in Burkina Faso: a longitudinal retrospective study

    Background: The epidemiology of meningococcal meningitis in the African meningitis belt is characterised by seasonality, localised epidemics and epidemic waves. To facilitate research and surveillance, we aimed to develop a definition of localised epidemics to be used in real-time surveillance based on weekly case reports at the health centre level. Methods: We used national routine surveillance data on suspected meningitis from January 2004 to December 2008 in six health districts in western and central Burkina Faso. We evaluated eight thresholds composed of weekly incidence rates at health centre level for their performance in predicting annual incidences of 0.4% and 0.8% in health centre areas. The chosen definition was then used to describe the spatiotemporal epidemiology and size of localised meningitis epidemics during the included district-years. Results: Among the eight weekly thresholds evaluated, a weekly incidence rate of 75 cases per 100,000 inhabitants during at least two consecutive weeks, with at least 5 cases per week, had 100% sensitivity and 98% specificity for predicting an annual incidence of at least 0.8% in health centres. Using this definition, localised epidemics were identified in all but one year during 2004-2008, concerned less than 10% of the districts' population, and were often geographically dispersed. Where sufficient laboratory data were available, localised epidemics were exclusively due to meningococci. Conclusions: This definition of localised epidemics at the health centre level will be useful for risk factor and modelling studies aimed at understanding the meningitis belt phenomenon, and will help document vaccine impact against epidemic meningitis where no widespread laboratory surveillance exists to quantify disease reduction after vaccination.
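
    The chosen definition is a simple rule over weekly case counts. Below is a minimal Python sketch of how such a threshold could be applied to health-centre surveillance data; the function name and the input format (a list of weekly case counts plus a catchment population) are illustrative assumptions, not part of the study.

        def is_localised_epidemic(weekly_cases, population,
                                  rate_threshold=75e-5,  # 75 cases per 100,000 inhabitants
                                  min_cases=5,           # at least 5 cases per week
                                  min_weeks=2):          # for at least 2 consecutive weeks
            """Return True if the weekly series meets the epidemic definition.

            weekly_cases : suspected-meningitis case counts, one entry per week
            population   : catchment population of the health centre area
            """
            consecutive = 0
            for cases in weekly_cases:
                rate = cases / population
                if rate >= rate_threshold and cases >= min_cases:
                    consecutive += 1
                    if consecutive >= min_weeks:
                        return True
                else:
                    consecutive = 0
            return False

        # Example: a health centre of 10,000 inhabitants with 9 and 11 cases in
        # consecutive weeks (90 and 110 per 100,000) meets the definition.
        print(is_localised_epidemic([1, 2, 9, 11, 3], population=10_000))  # True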

    Emergence of Epidemic Neisseria meningitidis Serogroup X Meningitis in Togo and Burkina Faso

    Serogroup X meningococci (NmX) have historically caused sporadic and clustered meningitis cases in sub-Saharan Africa. To study recent NmX epidemiology, we analyzed data from population-based, sentinel and passive surveillance, and outbreak investigations of bacterial meningitis in Togo and Burkina Faso during 2006–2010. Cerebrospinal fluid specimens were analyzed by PCR. In Togo during 2006–2009, NmX accounted for 16% of the 702 confirmed bacterial meningitis cases. Kozah district experienced an NmX outbreak in March 2007, with an NmX seasonal cumulative incidence of 33/100,000. In Burkina Faso during 2007–2010, NmX accounted for 7% of the 778 confirmed bacterial meningitis cases, with an increase from 2009 to 2010 (4% to 35% of all confirmed cases, respectively). In 2010, NmX epidemics occurred in northern and central regions of Burkina Faso; the highest district cumulative incidence of NmX was estimated at 130/100,000 during March–April. Although they were limited to a few districts, we documented NmX meningitis epidemics occurring with a seasonal incidence previously reported in the meningitis belt only for NmW135 and NmA, which argues for development of an NmX vaccine.

    Energy Resolution Performance of the CMS Electromagnetic Calorimeter

    The energy resolution performance of the CMS lead tungstate crystal electromagnetic calorimeter is presented. Measurements were made with an electron beam using a fully equipped supermodule of the calorimeter barrel. Results are given both for electrons incident on the centre of crystals and for electrons distributed uniformly over the calorimeter surface. The electron energy is reconstructed in matrices of 3×3 or 5×5 crystals centred on the crystal containing the maximum energy. Corrections for variations in shower containment are applied in the case of uniform incidence. The measured resolution is consistent with the design goals.
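
    The reconstruction step described above, summing deposits in a 3×3 or 5×5 window of crystals around the seed crystal, can be sketched as follows; the array layout, toy numbers and function name are assumptions for illustration only, not the CMS reconstruction software.

        import numpy as np

        def matrix_energy(crystal_energies, size=3):
            """Sum deposited energy in a size x size window centred on the seed crystal.

            crystal_energies : 2-D array of per-crystal energies
            size             : 3 for a 3x3 matrix, 5 for a 5x5 matrix
            """
            e = np.asarray(crystal_energies)
            seed_row, seed_col = np.unravel_index(np.argmax(e), e.shape)  # crystal with max energy
            half = size // 2
            rows = slice(max(seed_row - half, 0), seed_row + half + 1)
            cols = slice(max(seed_col - half, 0), seed_col + half + 1)
            return e[rows, cols].sum()

        # Toy 5x5 region with a central shower: most of the energy is already
        # contained in the 3x3 matrix around the seed.
        toy = np.array([[0.1, 0.2, 0.3, 0.2, 0.1],
                        [0.2, 2.0, 5.0, 2.0, 0.2],
                        [0.3, 5.0, 60.0, 5.0, 0.3],
                        [0.2, 2.0, 5.0, 2.0, 0.2],
                        [0.1, 0.2, 0.3, 0.2, 0.1]])
        print(matrix_energy(toy, size=3), matrix_energy(toy, size=5))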

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from the Monte Carlo generators MadGraph + Pythia and Sherpa, and to next-to-leading-order calculations from BlackHat + Sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the HT distributions at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values.
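
    Two of the observables listed above, the scalar sum of jet transverse momenta (HT) and the azimuthal separation between a jet and the muon, are simple functions of the reconstructed objects. A minimal Python sketch, assuming jets and the muon are given as (pT, phi) pairs with purely illustrative values:

        import math

        def scalar_ht(jet_pts):
            """H_T: scalar sum of the transverse momenta of the selected jets (GeV)."""
            return sum(jet_pts)

        def delta_phi(phi1, phi2):
            """Azimuthal separation folded into [0, pi]."""
            dphi = abs(phi1 - phi2) % (2 * math.pi)
            return min(dphi, 2 * math.pi - dphi)

        # Example with four leading jets and a muon (numbers are made up).
        jets = [(120.0, 0.3), (85.0, 2.9), (40.0, -1.2), (31.0, 1.7)]  # (pT in GeV, phi)
        muon_phi = -0.4
        print(scalar_ht([pt for pt, _ in jets]))   # H_T of the four leading jets
        print(delta_phi(jets[0][1], muon_phi))     # delta-phi(leading jet, muon)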

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to human inability to predict the future precisely, as written in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios, or alternatively, to maximize expected return among all portfolios that have at least a certain expected return. This study therefore focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function xᵀQx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for several investments, obtained with MATLAB R2007b software together with its graphical analysis.
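
    As a concrete illustration of the quadratic program described above, the sketch below minimizes the portfolio variance xᵀQx subject to a minimum expected return and fully invested weights. It uses scipy.optimize and made-up return and covariance numbers, so it is an assumption-laden sketch rather than the study's MATLAB implementation.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative inputs: expected returns (mu) and covariance matrix (Q) of three assets.
        mu = np.array([0.10, 0.07, 0.03])
        Q = np.array([[0.09, 0.01, 0.00],
                      [0.01, 0.04, 0.00],
                      [0.00, 0.00, 0.01]])
        r_min = 0.06  # required expected return

        def variance(x):
            return x @ Q @ x  # objective: portfolio variance x'Qx

        constraints = [
            {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},  # weights sum to one (Ax = b)
            {"type": "ineq", "fun": lambda x: mu @ x - r_min},   # expected return >= r_min
        ]
        bounds = [(0.0, 1.0)] * len(mu)  # no short selling

        result = minimize(variance, x0=np.full(len(mu), 1 / len(mu)),
                          bounds=bounds, constraints=constraints)
        print("optimal weights:", result.x.round(3))
        print("portfolio variance:", result.fun)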

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    Get PDF
    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency, not readily reconcilable with the demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of the BTE and ATE arrangements currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and gatekeepers. Two issues emerge from the analysis that are worthy of further reflection. Firstly, there is the problematic long-term sustainability of some ATE products. Secondly, there are the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.

    Financial Performance Assessment of Cooperatives in Pelalawan District

    This paper describes the development and financial performance of cooperatives in Pelalawan District during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method measures cooperative performance in terms of productivity, efficiency, growth, liquidity, and solvency. Productivity of cooperatives in Pelalawan was high, but efficiency remained low. Profit and income were high, liquidity was very high, and solvency was good.